
Save Recorded Cases

Scenarios

AREX records real online traffic as a large number of test cases. You can filter the important cases out of this traffic, save them as classic cases for reuse, and use them for regular regression validation.

tip

Saved test cases do not expire and remain valid forever.

Save test cases

  1. Select Replay in the sidebar and select the application you are working with.

[Screenshot: Replay report]

  2. From the replay execution list, select the replay task in which you want to save test cases and view the recorded test cases by selecting Case Table.

[Screenshot: Case table]

Select the corresponding case. A Failed case (i.e., a case whose replayed response differs from the recorded response message) will show the specific details of the difference below.

[Screenshot: Case differences]

Point of difference: Displays all the difference nodes found in the replay test. For main-interface validation, AREX mainly compares the recorded and replayed response messages. For validation of other external calls to third-party dependencies, it compares the request messages, such as the SQL statement sent to the database.

Scene Count: The number of scenes in which the difference node occurs.

Additional node: A node present in the return message only after recording or only after replay. If there are any, the difference points are highlighted in orange in the message.

Difference node: A node whose value differs between the recorded and replayed return messages. If there are any, the difference points are highlighted in blue in the message.

    tip

The interface shows all the test cases recorded between the recording start and end time set for this replay test, that is, all the cases recorded in the interval between Start Time and End Time in the figure below.

[Screenshot: Start replay]

    Select Record Detail to display the details of the recording case in a new page:

[Screenshot: Case details]

[Screenshot: Case details]

In AREX, the main interface is validated by comparing the differences between the recorded and replayed response messages, while other external dependencies are validated by comparing their request messages.

The left side shows the request messages sent to external dependencies (such as database calls) and the response message of the main interface during recording.

The right side shows the request messages sent to those dependencies and the response message of the main interface during replay.

  3. Select Save to save the case into the collection created previously.

[Screenshot: Save case]

[Screenshot: Save case]
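The node classification described above (additional nodes vs. difference nodes) can be illustrated with a minimal JSON diff sketch. This is a hypothetical simplification, not AREX's actual comparison engine, and `diffNodes` is a made-up helper name:

```javascript
// Sketch: classify top-level nodes of a recorded vs. replayed response message.
// - additional nodes: keys present in only one of the two messages
// - difference nodes: keys present in both but with different values
function diffNodes(recorded, replayed) {
  const additional = [];
  const different = [];
  const keys = new Set([...Object.keys(recorded), ...Object.keys(replayed)]);
  for (const key of keys) {
    if (!(key in recorded) || !(key in replayed)) {
      additional.push(key); // extra node after recording or replay
    } else if (JSON.stringify(recorded[key]) !== JSON.stringify(replayed[key])) {
      different.push(key); // same node, different value
    }
  }
  return { additional, different };
}

// Example: the replayed response changes "status" and adds a "traceId" node
const result = diffNodes(
  { status: 'OK', total: 2 },
  { status: 'FAIL', total: 2, traceId: 'abc' }
);
console.log(result); // additional: ['traceId'], different: ['status']
```

AREX's real comparison is recursive over the whole message tree, but the two categories it reports map onto these two branches.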

View test cases

Once the test case is saved, you can view it under the corresponding collection, where it behaves like a regular request.

You can categorize your test cases by adding tags. Just place your cursor on Add Tag and select the icon.

[Screenshot: View case]

tip

Tags can be pre-configured by selecting the Edit Workspace icon >> Labels tab.

[Screenshot: Manage labels]

By default, the request URL is set to the path of the interface; you can prepend the test host and port to the path and send the request to debug.

The arex-record-id in the request Headers is the recording ID. If you want to replay the case in a test environment, add the arex-record-id header to the new request and send it.

[Screenshot: View case]
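Sending such a request programmatically can be sketched as follows. The host, path, and recording ID below are placeholders you would replace with your own values:

```javascript
// Hypothetical example: replaying a saved case against a test environment by
// forwarding its recording ID in the arex-record-id header.
const testHost = 'http://my-test-host:18080';   // assumption: your test service
const recordId = 'AREX-172-17-0-1-xxxxxxxx';    // assumption: copied from the saved case

const options = {
  method: 'POST',
  headers: {
    'Content-Type': 'application/json',
    // The AREX agent detects this header and serves the mocked dependency
    // data recorded for this case instead of calling real downstream systems.
    'arex-record-id': recordId,
  },
  body: JSON.stringify({ /* original request body of the saved case */ }),
};

// Uncomment to actually send the request (Node.js 18+ provides global fetch):
// fetch(`${testHost}/your/interface/path`, options)
//   .then(res => res.text())
//   .then(console.log);
console.log(options.headers['arex-record-id']);
```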

The Mock tab shows all data and third-party dependencies mocked from the production environment during recording.

The left side displays the mocked request messages sent to the main interface and external dependencies, while the right side displays the corresponding response messages.

The mocked data is editable: if you are not satisfied with it, you can modify it manually and select "Save". The next replay will then use the modified mock data.

[Screenshot: View case]

Replay saved cases

Replaying saved cases means running replay testing and comparison only on the saved test cases, so that they can be regularly regression-validated. This improves testing efficiency, coverage, and quality while reducing testing costs.

AREX provides a Node.js script to implement regression testing for saved test cases of one app:

const MongoClient = require('mongodb').MongoClient;

// Connection and endpoint settings -- adjust them to your own environment.
const mongoUrl = "mongodb://arex:iLoveArex@10.5.153.1:27017/arex_storage_db";
const scheduleEndpoint = 'http://10.5.153.1:8092/api/createPlan';
const appId = 'community-test-0905';
const targetHost = 'http://10.5.153.1:18080';

const client = new MongoClient(mongoUrl);

async function main() {
  await client.connect();
  const db = client.db();
  console.log('Connected successfully to server');

  // Query the saved (pinned) cases of the app and group them by operation
  const pinnedCases = await db.collection('PinnedServletMocker')
    .find({ appId: appId })
    .project({ _id: 1, operationName: 1 })
    .toArray();
  const caseGroupByOp = new Map();
  pinnedCases.forEach(item => {
    if (!caseGroupByOp.has(item.operationName)) caseGroupByOp.set(item.operationName, []);
    caseGroupByOp.get(item.operationName).push(item._id);
  });

  // Query the service operations that the saved cases belong to
  const operations = await db.collection('ServiceOperation')
    .find({
      appId: appId,
      operationName: { $in: [...new Set(pinnedCases.map(item => item.operationName))] }
    })
    .toArray();

  // Build the createPlan request; replayPlanType 2 replays only the listed cases
  const createPlanReq = {
    "appId": appId,
    "replayPlanType": 2,
    "targetEnv": targetHost,
    "operator": "AREX",
    "operationCaseInfoList": operations.map(op => ({
      "operationId": op._id.toString(),
      "replayIdList": caseGroupByOp.get(op.operationName)
    }))
  };

  const reqStr = JSON.stringify(createPlanReq, null, 4);
  console.log(reqStr);

  // Submit the replay plan to the schedule service (requires Node.js 18+ for fetch)
  const response = await fetch(scheduleEndpoint, {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
    },
    body: reqStr,
  });

  return response.status;
}

main()
  .then(console.log)
  .catch(console.error)
  .finally(() => client.close());
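To run the script, you need Node.js 18 or later (for the built-in fetch API) and the official MongoDB driver. The filename below is a hypothetical choice:

```shell
# Install the MongoDB driver in the script's directory (assumption: npm is available)
npm install mongodb
# Save the script as replay-saved-cases.js (hypothetical name) and run it
node replay-saved-cases.js
```

The script prints the createPlan request body and the HTTP status returned by the schedule service.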

Once the script is executed, you can see the replay task on the reporting page: